3D Gaussian Splatting from Hollywood Films!

  • Published: 28 Sep 2024
  • Turning movie shots into 3D scenes using 3D Gaussian Splatting. We found movie clips online, converted scenes into image sequences, trained Gaussian splats (GSPLATs) on them, and brought them into UNREAL ENGINE 5.
    3D Gaussian Splatting for Real-Time Radiance Field Rendering Paper:
    repo-sam.inria...
    Join our discord server:
    / discord
    If you wanna see us do cool things, follow us here too:
    Instagram: / badxstudio
    Twitter: / badxstudio
    TikTok: / badxstudio
    LinkedIn: / badxstudio
    Bad Decisions Podcast 🎙️:
    podcasters.spo...
    Our personal handles: (if you wanna stalk us)
    / farhad_sh
    / farazshababs
    / farhads__
    / farazshababi
    / farhadshababi
    / farazshababi
    #unrealengine5 #3dgaussiansplatting #3dscan #gaussiansplatting #3d #3drender #nerf #unrealengine #blender3d #blender #drone #ai #photogrammetry #vfx #cgi #film
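The clip-to-splat pipeline the description mentions (movie clip → image sequence → camera poses → trained splat) can be sketched roughly as below. This is a hedged sketch: the `convert.py`/`train.py` entry points follow the public "gaussian-splatting" reference implementation, and the file names and fps value are made up; the exact workflow in the video may differ.

```python
# Hypothetical sketch of the pipeline described above: movie clip -> image
# sequence -> camera poses (COLMAP) -> 3D Gaussian splat training.
# convert.py / train.py follow the public gaussian-splatting reference repo.

def build_pipeline(clip, workdir, fps=4):
    """Return one argv list per stage of the clip-to-splat pipeline."""
    return [
        # 1. Extract an image sequence from the movie clip.
        ["ffmpeg", "-i", clip, "-vf", f"fps={fps}", f"{workdir}/input/%05d.jpg"],
        # 2. Recover camera poses and a sparse point cloud (wraps COLMAP).
        ["python", "convert.py", "-s", workdir],
        # 3. Train the 3D Gaussian splat on the posed images.
        ["python", "train.py", "-s", workdir],
    ]

# Example invocation (clip name is invented):
cmds = build_pipeline("shining_flyover.mp4", "scenes/shining")
```

Each argv list could then be handed to `subprocess.run` in order, with the frame rate kept low enough that consecutive frames still overlap for COLMAP.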

Comments • 452

  • @shelfrelated
    @shelfrelated 11 months ago +162

    THIS IS WAYYY TOO INTERESTING TO MISS OUT ON. And Faraz casually being confidently wrong about 1980 being 33 years ago (when it's actually 43) is another highlight.

  • @pavelf2002
    @pavelf2002 11 months ago +46

    Imagine how cool it could be with 4D Gaussians! It's a recent paper about Gaussian splats, but with motion. Yes, you can recreate dynamic 3D scenes from videos with this technique!

    • @pva5142
      @pva5142 11 months ago +8

      Yeah, the code is already out!

    • @irfanadamm5819
      @irfanadamm5819 11 months ago +1

      so CGI will look more realistic now?

    • @pva5142
      @pva5142 11 months ago +5

      @@irfanadamm5819 It's basically better for filmmakers in post. Once the fidelity gets really good, you're eliminating a lot of the DP's job outside of lighting, if you're able to change the focal length, camera movement and composition in post.

    • @badxstudio
      @badxstudio 11 months ago +15

      We saw the paper... and we are going to try it for sure; we just have 2 more experiments with 3DGS first: VR and video games.

    • @MrGhostfaceLives
      @MrGhostfaceLives 11 months ago +2

      @@pva5142 Why would they even want to do that? It's completely inorganic to do that, and the "fix it in post" mentality is gradually being forced out, thankfully. Also "outside of lighting"... so only the most important element of the job.

  • @der_alte_z0cker339
    @der_alte_z0cker339 11 months ago +112

    2:18 it is even 43 years old. 😉 Wow! Great video as always guys!

    • @koberko
      @koberko 11 months ago +10

      I wish it was "just" 33 years ago :-D I was born in 1981, so I wouldn't mind 10 years shaved off :-D

    • @willbrand77
      @willbrand77 11 months ago +4

      Not the best math 😅

    • @badxstudio
      @badxstudio 11 months ago

      hahahha

    • @badxstudio
      @badxstudio 11 months ago

      Oooppsss ..Quick Math

    • @foxy2348
      @foxy2348 11 months ago

      haha The math was amazing!

  • @ORTyOW
    @ORTyOW 11 months ago +9

    2:33 guys are living in 2013 😅 go grab that Bitcoin ASAP

    • @badxstudio
      @badxstudio 11 months ago +2

      hahahha I wish bro

  • @JaapvanderVelde
    @JaapvanderVelde 11 months ago +4

    You can tell Gaussian Splatting is simple to do because even obnoxious loud people who can't spell 'Shining' can do it.

  • @duytdl
    @duytdl 10 months ago +5

    This is awfully similar to lucid dreams, if you've had them. Very similar restrictions: you can't go beyond a certain point, where it becomes like an invisible border of less detail or something obstructing your view.

    • @badxstudio
      @badxstudio 10 months ago

      Wait really?? We’ve never had a lucid dream unfortunately. Is that truly how it feels?

    • @luiginotcool
      @luiginotcool 10 months ago

      I’ve never had a Lucid dream that had a barrier. Usually I can go wherever I like

    • @kevinsimpore2044
      @kevinsimpore2044 5 months ago

      @@luiginotcool If you used to fly in lucid dreams, you would have eventually noticed that you can't go beyond a certain altitude.

  • @Desopolis
    @Desopolis 10 months ago +3

    What tools are you using to generate the splats?

  • @zsigmondforianszabo4698
    @zsigmondforianszabo4698 11 months ago +19

    You guys are soooooo underrated... the videos are entertaining and educational. Good job!

    • @RicardoGonzalez-gu5vv
      @RicardoGonzalez-gu5vv 11 months ago

      They are the next Corridor Crew for the era of AI 3D VFX.

    • @wakopaco
      @wakopaco 11 months ago +2

      Content is good, but their style is very annoying. Can't stand all the forced wooooooows.

    • @badxstudio
      @badxstudio 11 months ago

      Thanks so much buddy! Glad you enjoy them

  • @ferspirit9733
    @ferspirit9733 3 months ago

    Man, the editing of this video is pure art! And the energy of both of you is contagious; you really got me into Unreal 💪🏻

  • @joelface
    @joelface 11 months ago +4

    Thanks for trying out movies. I was so excited to see how that would turn out. It's a little trickier than I thought it would be, even with some pretty perfect scenes (limited movement within the scene and rotating camera). Video games into VR is a great next idea, since a rotating camera is as simple as moving the right joystick in most cases. I'd also like to see if you can figure out how to edit the capture, delete unnecessary artifacts, etc. Is that possible?

    • @badxstudio
      @badxstudio 11 months ago +1

      In our UE video, we showed how you can crop these scenes, but we still don't have a precision tool for deleting individual ellipsoids in UE (devs have already made one for Unity).

  • @randfur
    @randfur 11 months ago +4

    In the Trinity shot, did you notice the doubling was gone? It was also squashed; it's like the 3D point-cloud stage undid the doubling by squashing the duplicates together, causing the whole scene to get squashed horizontally.

    • @badxstudio
      @badxstudio 11 months ago

      Yeah, we realized that too, but in general it made it difficult for the algorithm to arrange the point cloud.

  • @chrisgutierrez8599
    @chrisgutierrez8599 11 months ago +9

    Great stuff, guys. Always entertaining, and I'm learning while watching 🙌

    • @badxstudio
      @badxstudio 11 months ago +2

      Chris that's the goal ma man

    • @chrisgutierrez8599
      @chrisgutierrez8599 11 months ago +1

      Always blessed especially when these videos drop 💯

    • @badxstudio
      @badxstudio 11 months ago +1

      Good to hear that G

  • @Sharivari
    @Sharivari 11 months ago +2

    Just finished the full video. Your video production, knowledge, camera presence and humour will make your channel blow up.

    • @badxstudio
      @badxstudio 11 months ago +1

      Wowwww thank u so much buddy

  • @Anttisinstrumentals
    @Anttisinstrumentals 11 months ago +1

    First thought in my head: I think the Neo bullet scene was the second thing I tried after I installed the software. It did not come out so well either. Gaussian splatting can be used from now on for scenes like this, to better effect.

  • @DaffyDuckTheWizzard
    @DaffyDuckTheWizzard 10 months ago +1

    I think a cool way to use it would be to build 3D models from miniatures and sculptures for added realism, texture and flavor.

  • @DJ-Illuminate
    @DJ-Illuminate 11 months ago

    Imagine taking one of those YouTube videos where someone filmed San Francisco or LA in 1920 and, with the forward motion, creating a 3D or 4D splat. Somehow the forward video would have to be interpreted as a side view, for at least a 180° view.

  • @hanitouati
    @hanitouati 11 months ago +2

    2:30 nice math bro

  • @Uhfgood
    @Uhfgood 11 months ago +1

    What's kind of cool is that even though AI is used to generate these, you can use some other AI to replace the missing "footage" -- for instance, the trees and mountain behind. (Sure, it wouldn't necessarily be exactly what's actually behind the hotel, but it could generate the lost information to give you a reasonable facsimile of what might have been behind it.)

    • @badxstudio
      @badxstudio 11 months ago

      We were talking about it after watching Adobe MAX; it would be dope to try the video inpainting feature. This is definitely something that will eventually be part of the process.

  • @selftransforming5768
    @selftransforming5768 11 months ago

    Y'ALL ARE ALREADY ON A WHOLE NEW ERA!

    • @badxstudio
      @badxstudio 11 months ago

      Glad you think so mate

  • @dylanyesenofski5516
    @dylanyesenofski5516 11 months ago

    That particular shot in The Shining is of the Timberline Lodge on Mount Hood, Oregon, USA.

  • @BrentLeVasseur
    @BrentLeVasseur 11 months ago +1

    That shot from The Shining was taken at the top of the Mt. Hood ski resort in Oregon. There is a glacier there where you can ski even in the summertime. When I skied there, I instantly recognized the lodge as the one used in all the exterior shots in The Shining.

    • @badxstudio
      @badxstudio 11 months ago

      Oh that is so cool to hear!! How was the ski experience there tho?

    • @BrentLeVasseur
      @BrentLeVasseur 11 months ago

      @@badxstudio I trained there with the Swedish and Korean national teams during the summer of 1991. Glacier skiing in the summer isn't the best compared to normal winter powder skiing, but it's good for training in the off-season.

  • @ORTyOW
    @ORTyOW 11 months ago +2

    That's crazy, very impressive!

  • @DejayClayton
    @DejayClayton 11 months ago +1

    I wonder how good these results would be if you exactly mirrored the actual camera path and tracking, and then created 3D models from the point-in-time Gaussian splat viewpoints that intersect across all frames.

    • @badxstudio
      @badxstudio 11 months ago

      That is a cool idea in theory! We will have to test it to see if it will actually work tho!

    • @DejayClayton
      @DejayClayton 11 months ago

      @@badxstudio keep me updated!

  • @BeachcomberFilms
    @BeachcomberFilms 11 months ago +1

    You guys beat me to it!!! I'm wondering how Gaussian splatting would go with stereoscopic side-by-side video at full frame.

  • @Rocksteady72a
    @Rocksteady72a 11 months ago +1

    Listening to this on 10% volume and it still feels like I'm going to get a noise complaint

    • @badxstudio
      @badxstudio 11 months ago

      Hahahahahahahah 😜

  • @red4666
    @red4666 8 months ago

    Can the gsplats cast and/or receive shadows? Like, if you put a cube in the middle of the table in that one scene and cast a light, will the table receive a shadow?

  • @DJ-Illuminate
    @DJ-Illuminate 11 months ago

    If I were going to film a talk with a host and guest, I assume I would do a 360 of them in their chairs before the 2D video interview started, in order to create a 4D splat of the 2D video? How would you do this?

  • @Edbrad
    @Edbrad 10 months ago +1

    Loads of Jerry Bruckheimer films you could have used.

  • @Sharivari
    @Sharivari 11 months ago +1

    That's so cool!

    • @badxstudio
      @badxstudio 11 months ago

      Couldn't agree more!

  • @r.m8146
    @r.m8146 11 months ago +4

    Please, do it in VR.

    • @badxstudio
      @badxstudio 11 months ago +1

      How did you guess the next video title? :D :D :D :D :D

    • @sabomarius128
      @sabomarius128 11 months ago

      Yes!

  • @GranulatedStuff
    @GranulatedStuff 10 months ago

    Holy moly, turn it down a few notches, lads!

  • @mystica-subs
    @mystica-subs 10 months ago

    The weirdness you get from Trinity in The Matrix is due to framerate/telecine conversion errors. A properly inverse-telecined DVD or a non-rate-adjusted 23.976 fps Blu-ray would be the ideal source for this shot.
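The telecine issue this comment describes can be illustrated with a toy model. This is a sketch under simplifying assumptions: frames are plain labels and field parity is ignored, but it shows why 3:2 pulldown (which fits 24 fps film onto ~30 fps video) produces video frames that mix two different film frames, and why inverse telecine can recover the originals.

```python
# Toy model of 3:2 pulldown: each group of 4 film frames (A, B, C, D)
# becomes 10 fields = 5 video frames, 2 of which mix two film frames.
# Those mixed frames are the "weirdness" a frame-based 3DGS pipeline sees.

def pulldown_32(frames):
    """Film frames -> interlaced video frames as (field, field) pairs."""
    fields = []
    for i, f in enumerate(frames):
        fields += [f] * (3 if i % 2 == 0 else 2)  # 3 fields, then 2, ...
    # Pair consecutive fields into video frames.
    return list(zip(fields[0::2], fields[1::2]))

def inverse_telecine(video_frames):
    """Undo pulldown: flatten back to fields, drop consecutive repeats."""
    fields = [f for pair in video_frames for f in pair]
    out = []
    for f in fields:
        if not out or out[-1] != f:
            out.append(f)
    return out

video = pulldown_32(["A", "B", "C", "D"])  # 5 video frames, 2 of them mixed
```

Running `inverse_telecine(video)` returns the clean `["A", "B", "C", "D"]` sequence, which is why a properly inverse-telecined source is a better input than a rate-converted one.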

  • @larswillsen
    @larswillsen 10 months ago +1

    Did he say the Shining helicopter scene was shot 20 years ago? It's actually almost 50 years ago (approx. 43-45). :-)

  • @Marco-xz7rf
    @Marco-xz7rf 10 months ago

    Hey, can you please try this on Apollo 11 or other missions? I tried it with Apollo 11, but only with the photos, and they aren't overlapping enough :( Is it possible to somehow tell the 3DGS where each photo was taken and at what angle? Would that help?

  • @leptok3736
    @leptok3736 11 months ago +1

    Haha nice, I was thinking of something related: can you generate enough consistent AI images to create a 3D Gaussian splat of something novel? Mine was the Space Needle rising out of a foggy forest instead of the city, but I couldn't get enough consistency to generate a point cloud. Someday soon, maybe.

    • @joelface
      @joelface 11 months ago

      You would need some drone footage of the space needle, I think. What were you using?

    • @leptok3736
      @leptok3736 11 months ago

      @@joelface Yeah, I was trying to learn COLMAP. I was thinking that if you could get enough images of roughly the same object, it might be able to stitch something together. Idk if I'm fundamentally misunderstanding the process or if the images I had just weren't consistent enough.
      Does it need something like footage, where you can get dense frames?

    • @joelface
      @joelface 11 months ago +1

      @@leptok3736 Footage is best because it's all the same exposure, same lens, same time of day, with a consistent path of depth for the software to build around. If you're just uploading a bunch of random images of the same object, there could be a lot of differences the software can't figure out what to do with.

  • @shilze1
    @shilze1 11 months ago

    I think static, long views of fully CG scenes are probably the best bets for getting anything out of this.
    Which is at least interesting for the idea that you could then take old, somewhat subpar CG shots and recreate the scenes with the same camera movement. At least until it's possible to account for the moving elements completely.
    Video games are cool and all as a use case, but most of them are already 3D and can have their assets ripped, so the only real use case there is putting funny things inside the scenes.

  • @qbert4325
    @qbert4325 11 months ago +1

    At this point I am not even surprised to see what it can do

    • @badxstudio
      @badxstudio 11 months ago

      Hahahahahaha I know right!

  • @thevfxmancolorizationvfxex4051
    @thevfxmancolorizationvfxex4051 11 months ago +1

    Would it work for scenes where the camera moves across, or remains stationary? I'm thinking of using this method to colorize black-and-white footage, so I'm just wondering.

    • @badxstudio
      @badxstudio 11 months ago +2

      The camera needs to be moving to show clear depth! As of now, this Gaussian splatting tech works by finding points of interest, and if you don't move your camera the right way, your training might fail!

    • @thevfxmancolorizationvfxex4051
      @thevfxmancolorizationvfxex4051 11 months ago

      @@badxstudio All I need really is for the scenes to be rendered exactly how they are for what I want to do, as I'd like to edit each object individually

    • @badxstudio
      @badxstudio 11 months ago

      @@thevfxmancolorizationvfxex4051 I'm sure it's gonna be doable! You just have to test it out!

    • @joelface
      @joelface 11 months ago

      @@thevfxmancolorizationvfxex4051 I think there are other methods you could use for recolorizing old footage. I don't think gsplats will be the right way to do it, especially with a stationary camera... it can't figure out the depth of a scene without multiple viewpoints. It doesn't create any data that isn't already in the video.

  • @Kenb3d1
    @Kenb3d1 11 months ago +1

    Brilliant idea.👍

  • @dannyd4339
    @dannyd4339 10 months ago

    "That movie was made in the 80s, that was 33 years ago" ...what? But then again, you can tell these guys aren't the sharpest tools in the shed. They are marveling at something your phone has been able to do for a decade now.

  • @Mowgi
    @Mowgi 11 months ago +1

    43 years ago, my friends 😅

    • @badxstudio
      @badxstudio 11 months ago

      Hahahaha fml … we failed maths

  • @Appleloucious
    @Appleloucious 11 months ago

    One Love!
    Always forward, never ever backward!!
    ☀☀☀
    💚💛❤
    🙏🏿🙏🙏🏼

  • @TheFachen
    @TheFachen 10 months ago

    This technique would be great for an overhead ring (or partial ring) of synchronized cameras, similar to the Matrix shots - not to take the single output video per se, as you did, but so the filmmakers would be able to relight the scene in post.

  • @fnytnqsladcgqlefzcqxlzlcgj9220
    @fnytnqsladcgqlefzcqxlzlcgj9220 11 months ago

    Lol, for that first shot you have to undo the anamorphic squeeze or it looks funny.

  • @sissyphussartre2907
    @sissyphussartre2907 11 months ago +2

    1980 was 43 years ago

  • @Edbrad
    @Edbrad 10 months ago +1

    3:42 now consider the next leap forward where it will intelligently create all the parts we never had a camera record.

  • @LutzTeichmann
    @LutzTeichmann 11 months ago

    Looks like a visualization of memories.

  • @Ardi_0
    @Ardi_0 11 months ago

    This is mindblowing

  • @thronosstudios
    @thronosstudios 11 months ago +1

    Imagine this thing being able to record every frame properly (instead of a few) and you could literally pause in any of them and look at them from any angle without losing detail. If anything, they should be able to fill in the blanks with AI

    • @badxstudio
      @badxstudio 11 months ago +1

      Yeah, that's the thought! We like to believe that very soon AI will be able to help generate the missing detail and create a complex scene that looks stunning!

  • @3Ddex
    @3Ddex 11 months ago +2

    I really love the work you guys do. Thank you ❤

  • @HongPong
    @HongPong 11 months ago

    You guys are crazy for this one. Stanley Kubrick's ghost is going to be up and snooping 👻🎥🧿

    • @badxstudio
      @badxstudio 11 months ago

      hahaha hope he is not angry.. we love his movies

  • @SHolmesS
    @SHolmesS 11 months ago +1

    How much could this change 3D films in VR?

    • @badxstudio
      @badxstudio 11 months ago +1

      We are bringing these 3D scenes into VR in the next video. Gonna check the performance and quality ;)

  • @MrLyonliang
    @MrLyonliang 10 months ago

    amazing ideas!

  • @DJ-Illuminate
    @DJ-Illuminate 11 months ago

    One more thing. Say I want to scan a building with a drone. Could I take interior shots and combine them for an outside and inside model? I assume once loaded into Unity or whatever you could piece the separate splats together.

  • @metedev
    @metedev 11 months ago

    Good work!

    • @badxstudio
      @badxstudio 11 months ago +1

      Thank you! Cheers!

  • @ORTyOW
    @ORTyOW 11 months ago

    13:46 it's Trinity calling, because you called her

    • @badxstudio
      @badxstudio 11 months ago +1

      OH SHIT!!!! WE JUST REALISED THAT hahahahahha imagine if we added that to the script

    • @ORTyOW
      @ORTyOW 11 months ago

      @@badxstudio ahahaha

  • @elliotmarks06
    @elliotmarks06 11 months ago

    How did you convert the Gaussians to geometry to work in unreal engine?

    • @badxstudio
      @badxstudio 11 months ago

      We used a plugin to convert them into Niagara particles.

  • @hamidvisuals
    @hamidvisuals 11 months ago

    Fantastic stuff, I had no clue about this technology until I came across your channel guys. Parcham Balast

  • @arshputz
    @arshputz 11 months ago

    You blew my mind when you said Inglourious Basterds came out 15 years ago.

  • @juanjosoliz7297
    @juanjosoliz7297 11 months ago +3

    43 years

    • @badxstudio
      @badxstudio 11 months ago

      ooopppss sorry hahah

  • @3dMistri
    @3dMistri 11 months ago

    This is better than an HDRI and can be used for behind objects; with time it will get better. But how is it different from photogrammetry?

    • @badxstudio
      @badxstudio 11 months ago

      Well, with photogrammetry you get geometry with actual polygons, which can be useful for things such as collision! There are other aspects to consider too, of course :)

  • @mersalmassgamer4348
    @mersalmassgamer4348 10 months ago

    Bro, this is crazy. Future filmmaking is gonna bring more creativity to pictures 🔥

  • @felixgeen6543
    @felixgeen6543 11 months ago

    Video games will be a lot easier since there's no lens distortion, no lens blur, no motion blur, no lens aberrations, and more FPS... it's basically easier to do structure from motion on video game footage in almost every way, except for the HUD, which would need to be removed.

    • @badxstudio
      @badxstudio 11 months ago

      We are testing that too! Exactly, it's gonna be much simpler! Thinking about Cyberpunk maybe for the test! Any recommendations?

  • @DJ-Illuminate
    @DJ-Illuminate 11 months ago

    The Neo shot works but you need like a splat editor to remove the artifacts.

  • @jumpieva
    @jumpieva 8 months ago

    Considering this tech is in its infancy, imagine where we'll be 1 or 5 years from now.

  • @lukesfilmltd
    @lukesfilmltd 11 months ago

    looking awesome!

  • @thomasreynolds6305
    @thomasreynolds6305 11 months ago

    So I have been using Luma interactive scenes. It's pretty good, but I know you mentioned in a previous video that you can get more control with the method you're using by letting it go through more iterations. I have a 4090 GPU. Do you think it's worth doing the setup to get the extra fidelity? Also, how long do you usually train a splat?

  • @elliotmarks06
    @elliotmarks06 11 months ago

    Super cool! I've been wanting to try the same thing with the matrix and NeRFs for a while now.

    • @badxstudio
      @badxstudio 11 months ago

      Right???? it's super exciting!!

  • @changgou8339
    @changgou8339 11 months ago

    There is a scene from Sherlock Holmes where Watson is getting married. Everything is frozen, and it was shot from different angles. You could probably make a good 3D scene out of it.

    • @badxstudio
      @badxstudio 11 months ago

      The movie or the TV series? Tell us exactly which one and we are going to do it haha

  • @tekanobob
    @tekanobob 10 months ago +1

    Dude, 2023 - 1980 = 43 years ago.

    • @badxstudio
      @badxstudio 10 months ago

      Hahhahah yeah we realised that :D

  • @SuperSupergeo
    @SuperSupergeo 11 months ago

    Why did you guys not cut out the top and bottom widescreen bars? Great video otherwise, though, and a very interesting use of Gaussian splats and novel view synthesis techniques in general!
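Cropping those bars before frame extraction is simple arithmetic, since black letterbox bars only add useless "features" for the reconstruction step. A hedged sketch: the ffmpeg `crop=w:h:x:y` filter syntax is real, but this helper and its numbers are illustrative, not the workflow from the video.

```python
# Illustrative helper for trimming letterbox bars (or a fixed HUD band)
# before extracting frames. Generates an ffmpeg crop filter string that
# keeps only the active picture of a letterboxed frame.

def letterbox_crop(width, height, content_aspect):
    """ffmpeg crop filter for the active picture of a letterboxed frame."""
    active = round(width / content_aspect)  # visible picture height
    active -= active % 2                    # keep height even for 4:2:0 chroma
    top_bar = (height - active) // 2        # height of the top black bar
    return f"crop={width}:{active}:0:{top_bar}"

# A 2.39:1 movie letterboxed inside a 1920x1080 frame:
filt = letterbox_crop(1920, 1080, 2.39)
```

The resulting string would be passed to ffmpeg as `-vf "crop=..."` before the `fps` filter, so COLMAP never sees the bars.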

  • @krisserold134
    @krisserold134 11 months ago

    Y'all are CRAZY and I love it

    • @badxstudio
      @badxstudio 11 months ago

      hahaha thanks buddy

  • @pedxing
    @pedxing 11 months ago

    VLC FTW! And if you wanted to know... the Shining LOCATION was filmed at the Stanley Hotel in Estes Park, Colorado. =)

    • @pedxing
      @pedxing 11 months ago

      also... 2023-1980=43 =)

    • @badxstudio
      @badxstudio 11 months ago +1

      Have you been to the shining location?

    • @pedxing
      @pedxing 11 months ago +1

      @@badxstudio Yep! Stayed there. Stayed in room 217, even. =) Hiked the mountains around it, swam naked in the pool at 2am, walked the stairs, listened to ghost stories, had an AMAZING time. Also... turns out the stairs in the Stanley are ALSO where they shot the scene in Dumb and Dumber where the boys are fighting trying to get to the top.

  • @alvydasjokubauskas2587
    @alvydasjokubauskas2587 11 months ago

    OK guys, I am convinced. I am also going to do this stuff; hopefully I will write a worthy research paper about Gaussian splatting.

    • @badxstudio
      @badxstudio 11 months ago

      Go for it mate.. you won't regret it

  • @felixgeen6543
    @felixgeen6543 11 months ago

    You guys even explained yourselves why the Batman shot doesn't work: it's zooming, and changing focal lengths are not supported by COLMAP. You would need to solve the variable focal length in software like SynthEyes and cancel the zoom by scaling the image down inside the frame as the footage zooms in. This would also require you to remove the variable lens distortion present in most zoom lenses. It's a complex thing to do.
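The zoom-compensation idea in this comment boils down to simple arithmetic: image magnification is proportional to focal length, so scaling each frame by f_ref / f_frame cancels the zoom relative to a reference frame. A sketch under stated assumptions: the per-frame focal lengths would come from a lens solve (e.g. in SynthEyes), and the values below are invented.

```python
# Sketch of the zoom-normalization idea: magnification is proportional to
# focal length, so scaling frame i by f_ref / f_i brings it back to the
# reference field of view. Focal length values here are made up; a real
# shot would get them from a per-frame lens solve.

def zoom_compensation_scales(focals_mm, ref_index=0):
    """Per-frame scale factors that undo a zoom relative to a reference frame."""
    f_ref = focals_mm[ref_index]
    return [f_ref / f for f in focals_mm]

# A shot zooming in from 35mm to 70mm: later frames get scaled down,
# matching the "scale the image down inside the frame" idea above.
scales = zoom_compensation_scales([35.0, 42.0, 56.0, 70.0])
```

This only normalizes magnification; as the comment notes, the per-focal-length lens distortion would still have to be removed separately.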

    • @badxstudio
      @badxstudio 11 months ago

      Yessir!! U got that right 🙏🫡

  • @HamalgamUB
    @HamalgamUB 11 months ago

    Your videos are great.

    • @badxstudio
      @badxstudio 11 months ago

      We appreciate that!

  • @poloo92
    @poloo92 11 months ago

    That's actually awesome!! GG 👏👏 It brings so much exploration to enjoy and understand a scene... the setting, the composition, the atmosphere... Super video; as you said, in a VR viewer it would be craaazy.

    • @badxstudio
      @badxstudio 11 months ago +1

      Thanks a lot! Glad you liked it buddy...yeah VR is next

  • @matthewpublikum3114
    @matthewpublikum3114 11 months ago

    What are the hardware specs you're using?

  • @Phyzx420
    @Phyzx420 11 months ago

    This reminds me of Cyberpunk 2077's braindance tech. If you can recreate scenes from movies, you can do the same from a vlog, which is so crazy. Digital forensics 😯

    • @badxstudio
      @badxstudio 11 months ago +1

      yess!! EXACTLY!! Braindance was so cooolll ... can't wait for Gaussian Splats to be dynamic

  • @salt806
    @salt806 11 months ago

    Keep ‘em coming!

  • @digitalzenith6527
    @digitalzenith6527 11 months ago

    Can you run Mesh to MetaHuman on a head out of this? That would be ultra dope.

  • @RaulFrizado
    @RaulFrizado 11 months ago

    Good shit, very interesting.
    Someday you guys will get a viral 1M+ video. You deserve it.

    • @badxstudio
      @badxstudio 11 months ago

      YESSSIRRRRR!!! And when we do ... we shall celebrate together

  • @BradleySmith1985
    @BradleySmith1985 11 months ago

    And turn them into 3D VR movies?

  • @JosephWraith
    @JosephWraith 11 months ago +1

    Bad ass!

  • @ZipZapTesla
    @ZipZapTesla 4 months ago

    This will open a new realm of shitposts

  • @MrTPGuitar
    @MrTPGuitar 11 months ago

    Have you tried to "de-blur" the source images by restoring them with AI? If we have some temporal coherence, that would probably give some amazing results when fed into the pipeline.

    • @badxstudio
      @badxstudio 11 months ago

      We tested Topaz to enhance the photos, and even though they looked better, for some reason it did not create better results.

  • @MrLarsalexander
    @MrLarsalexander 11 months ago

    The Harry Potter clip with Hogwarts Castle can be upscaled and given more detail using AI on each frame. Then the render would look really awesome!

    • @badxstudio
      @badxstudio 11 months ago

      TBH we tried Topaz upscaling on another scan of ours and it didn't help with the training for some reason. We will have to try again with movies; maybe it will work!!

  • @MrLarsalexander
    @MrLarsalexander 11 months ago

    From the Inglourious Basterds movie: what about selecting the actors from all frames and making sure they always come from one or more frames in which they never move?!
    Use some PS Content-Aware Fill or AI to fill in the areas around them, and on the actors, that might be missing or newly visible in those frames because the camera angle changes.
    Perhaps you guys could use AI to generate pictures of them from those other angles, to fake them not moving while only the camera moves?!
    Then try to render this scene again to avoid getting artifacts on them.

  • @literallykevin
    @literallykevin 11 months ago

    Omg, combining this with some of the stuff Adobe has been doing lately to "fill in the cracks" and upscale... You'll be able to pull flawless backgrounds, remove distractions, fix faces and other moving items automatically, add items, and repose and recompose your shots with no effort at all.

    • @badxstudio
      @badxstudio 11 months ago +1

      You said it best! Will definitely be trying that all out

    • @literallykevin
      @literallykevin 11 months ago

      @@badxstudio I'm sooo hyped. You two have THE best energy!

  • @johnw65uk
    @johnw65uk 11 months ago

    A great use of AI would be to fill in the areas the camera can't see. A bit like Photoshop's AI Generative Fill, for 3D.

    • @badxstudio
      @badxstudio 11 months ago

      we were thinking about the same thing... it's coming for sure

  • @robertomachado7581
    @robertomachado7581 2 months ago

    I guess you mean "43 years ago"? ...Great math!

  • @SirPaulMuaddib
    @SirPaulMuaddib 11 months ago

    Do you know what they will inscribe on your tombstone?
    "33 YEARS AGO"

    • @badxstudio
      @badxstudio 11 months ago

      hahaha RUclips will never forgive us

  • @mackhavoc477
    @mackhavoc477 11 months ago +1

    That math, though!! Try 43, lol

  • @MogulSuccess
    @MogulSuccess 11 months ago

    Here's my pitch
    New Industry - Virtual Reality Movies | See "Ready Player One"
    Allow people to interact with old movies
    Mode A. Casual observers walking around the scene
    Mode B. "Karaoke Mode" - protagonists say lines on cue and are in the scenes on cue, scoring points "a la" Just Dance
    Mode C. Party Mode - friends play supporting characters
    Mode D. Multiverse Mode - GTA worlds of the entire movie available to explore RPG style where user can randomly switch Modes A - C

    • @badxstudio
      @badxstudio 11 months ago +1

      Yo this is such a cool idea!! Totally can see this happening hahaha so cool

    • @MogulSuccess
      @MogulSuccess 11 months ago

      @@badxstudio If you have the bandwidth, I have some capital resources; we could create a prototype and go from there.

  • @Valkyrie9000
    @Valkyrie9000 10 months ago +1

    Naturally, the plural of princess is princesses.
    "The principal had the principles of a prince, but the princesses paid prices, and his processes preceded many a crisis."
    It's a stupid made-up language.

    • @badxstudio
      @badxstudio 10 months ago

      Lol thanks for clarifying that hahahaha

  • @amirnajafi-pro
    @amirnajafi-pro 11 months ago

    Such great content 👏🏻🔥🔝

  • @patfish3291
    @patfish3291 11 months ago

    The limit is the Quality ;-)

    • @badxstudio
      @badxstudio 11 months ago

      Give it some time, it will catch up

    • @patfish3291
      @patfish3291 11 months ago

      @@badxstudio not for that use-case :-) only AI can create something that isn't in the original footage (resolution/occlusions) :-)

  • @caliberpokwana8872
    @caliberpokwana8872 11 months ago

    I wanna do the Google Maps one and do the shots myself.

    • @badxstudio
      @badxstudio 11 months ago

      U mean u want to Gaussian-splat Google Maps in Unreal?

  • @0rdyin
    @0rdyin 11 months ago

    This is exactly like the braindance sequence from Cyberpunk 2077..

    • @badxstudio
      @badxstudio 11 months ago

      PRECISELY!!!
      Just wait till you see us taking this to VR in the next video. We actually scaled the sprite sizes in the sequencer so they suddenly enlarge and create the scene; it looks so similar to braindance!

  • @Foxtrop13
    @Foxtrop13 11 months ago

    If The Shining was filmed in 1980, wouldn't it be 43 years ago?

    • @badxstudio
      @badxstudio 11 months ago

      yeahhh we fked that up :D :D Quick MATHSSSS